Taylor Expansion for the Entropy Rate of Hidden Markov Chains
Abstract:
We study the entropy rate of a hidden Markov process defined by observing the output of a symmetric channel whose input is a first-order Markov process. Although this definition is simple, computing the entropy rate exactly remains an open problem. We introduce probability matrices based on the parameters of the Markov chain and the channel, and then estimate the entropy rate of the hidden Markov chain using matrix algebra and its spectral representation. To do so, we use a Taylor expansion and compute estimates of its first and second terms for the entropy rate of the hidden Markov process and its binary version, respectively. For small ε (the channel parameter), the entropy rate computed from the first term of the Taylor expansion has maximum error o(ε²), and the rate computed from the second term has maximum error o(ε³).
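The binary setting in the abstract (a symmetric Markov input passed through a binary symmetric channel with crossover probability ε) can be explored numerically. The sketch below is not the paper's method; it is a standard Monte Carlo estimator based on the forward filter, averaging −log₂ P(yₜ | y₁,…,yₜ₋₁) along a simulated path. All function names and parameter choices are illustrative.

```python
import math
import random

def h2(p):
    """Binary entropy in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def entropy_rate_mc(p, eps, n=100_000, seed=0):
    """Monte Carlo estimate (bits/symbol) of the entropy rate of the hidden
    Markov chain: a binary symmetric Markov input with flip probability p,
    observed through a binary symmetric channel with crossover eps.
    Averages -log2 P(y_t | y_1..y_{t-1}) computed by the forward filter."""
    rng = random.Random(seed)
    x = rng.randrange(2)   # stationary start (uniform for the symmetric chain)
    b = 0.5                # filter belief P(X_t = 1 | y_1..y_t)
    h = 0.0
    for _ in range(n):
        x = x ^ (rng.random() < p)            # Markov transition (flip w.p. p)
        y = x ^ (rng.random() < eps)          # channel output (flip w.p. eps)
        q = b * (1 - p) + (1 - b) * p         # predicted P(X_t = 1 | past y's)
        py1 = q * (1 - eps) + (1 - q) * eps   # P(Y_t = 1 | past y's)
        py = py1 if y == 1 else 1 - py1
        h -= math.log2(py)
        # Bayes update of the filter belief
        b = q * (1 - eps) / py1 if y == 1 else q * eps / (1 - py1)
    return h / n
```

At ε = 0 the estimate reduces to the binary entropy H(p) of the input chain, which is the zeroth-order term of the Taylor expansion and provides a quick sanity check; as ε grows, the estimated rate rises toward 1 bit/symbol.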
Similar resources
The Rate of Rényi Entropy for Irreducible Markov Chains
In this paper, we obtain the Rényi entropy rate for irreducible aperiodic Markov chains with countable state space, using the theory of countable nonnegative matrices. We also obtain a bound on the Rényi entropy rate of an irreducible Markov chain. Finally, we show that this bound is the Shannon entropy rate.
Entropy Rate for Hidden Markov Chains with Rare Transitions
We consider hidden Markov chains obtained by passing a Markov chain with rare transitions through a noisy memoryless channel. We obtain asymptotic estimates for the entropy of the resulting hidden Markov chain as the transition rate is reduced to zero. Let (Xn) be a Markov chain with finite state space S and transition matrix P(p), and let (Yn) be the hidden Markov chain observed by passing (Xn...
Analyticity of Entropy Rate of Continuous-State Hidden Markov Chains
We prove that under certain mild assumptions, the entropy rate of a hidden Markov chain, observed when passing a finite-state stationary Markov chain through a discrete-time continuous-output channel, is jointly analytic as a function of the input Markov chain parameters and the channel parameters. In particular, as consequences of the main theorems, we obtain analyticity of the entropy rate as...
Relative Entropy Rate between a Markov Chain and Its Corresponding Hidden Markov Chain
In this paper we study the relative entropy rate between a homogeneous Markov chain and a hidden Markov chain defined by observing the output of a discrete stochastic channel whose input is the finite-state-space homogeneous stationary Markov chain. For this purpose, we obtain the relative entropy between two finite subsequences of the above-mentioned chains with the help of the definition of...
Estimation of the Entropy Rate of Ergodic Markov Chains
In this paper an approximation of the entropy rate of an ergodic Markov chain is calculated via sample-path simulation. Although an explicit formula for the entropy rate exists, the exact computation is laborious to carry out. It is demonstrated that the estimated entropy rate of the Markov chain via a sample path not only converges to the correct entropy rate but also does so exponential...
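The sample-path idea in this abstract can be illustrated with a short sketch (not the paper's algorithm; function names and the example chain are illustrative): simulate the chain and average −log₂ of the transition probabilities actually taken, which converges to the entropy rate by the ergodic theorem.

```python
import math
import random

def markov_entropy_rate_exact(P, pi):
    """Exact entropy rate (bits/step): -sum_i pi_i sum_j P_ij log2 P_ij."""
    return -sum(pi[i] * P[i][j] * math.log2(P[i][j])
                for i in range(len(P))
                for j in range(len(P)) if P[i][j] > 0)

def markov_entropy_rate_path(P, n=200_000, seed=1):
    """Sample-path estimate: simulate n transitions and average
    -log2 of the transition probability used at each step."""
    rng = random.Random(seed)
    s = 0
    h = 0.0
    for _ in range(n):
        t = rng.choices(range(len(P)), weights=P[s])[0]  # next state
        h -= math.log2(P[s][t])
        s = t
    return h / n
```

For a two-state chain with P = [[0.9, 0.1], [0.2, 0.8]] (stationary distribution (2/3, 1/3)), the path estimate agrees with the exact formula to a few decimal places for moderate path lengths.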
The Relative Entropy Rate For Two Hidden Markov Processes
The relative entropy rate is a natural and useful measure of distance between two stochastic processes. In this paper we study the relative entropy rate between two Hidden Markov Processes (HMPs), which is of both theoretical and practical importance. We give new results showing analyticity, representation using Lyapunov exponents, and Taylor expansion for the relative entropy rate of two discr...
Journal: volume 7, issue 2, pages 103–120, publication date 2011-03